Robust Kullback-Leibler Divergence and Universal Hypothesis Testing for Continuous Distributions

Abstract

Universal hypothesis testing refers to the problem of deciding whether samples come from a nominal distribution or from an unknown distribution that differs from it. Hoeffding’s test, whose test statistic is equivalent to the empirical Kullback-Leibler divergence (KLD), is known to be asymptotically optimal for distributions defined on finite alphabets. With continuous obs...
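To make the finite-alphabet baseline concrete, here is a minimal sketch of the Hoeffding-style statistic: the KLD between the empirical distribution of the observed samples and the nominal pmf. The function name, the example nominal pmf, and the threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def empirical_kld_statistic(samples, nominal, alphabet_size):
    """Hoeffding-style test statistic: KLD between the empirical
    distribution of the samples and the nominal pmf.

    `samples` are integer symbols in {0, ..., alphabet_size - 1};
    `nominal` is the nominal pmf over the same alphabet.
    """
    counts = np.bincount(samples, minlength=alphabet_size)
    empirical = counts / len(samples)
    # 0 * log(0 / q) is taken as 0, so only support points contribute.
    mask = empirical > 0
    return float(np.sum(empirical[mask] *
                        np.log(empirical[mask] / nominal[mask])))

# Illustrative use: reject the nominal hypothesis when the statistic
# exceeds a threshold (an arbitrary placeholder, not the paper's choice).
rng = np.random.default_rng(0)
nominal = np.array([0.5, 0.3, 0.2])
samples = rng.choice(3, size=1000, p=[0.4, 0.4, 0.2])  # true pmf != nominal
stat = empirical_kld_statistic(samples, nominal, 3)
print(stat, stat > 0.05)  # placeholder threshold
```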

Similar articles

Kullback-Leibler Divergence Constrained Distributionally Robust Optimization

In this paper we study distributionally robust optimization (DRO) problems where the ambiguity set of the probability distribution is defined by the Kullback-Leibler (KL) divergence. We consider DRO problems where the ambiguity is in the objective function, which takes the form of an expectation, and show that the resulting minimax DRO problems can be formulated as a one-layer convex minimization ...
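As a sketch of the kind of one-layer reformulation described, the well-known convex dual sup over {Q : KL(Q||P) <= rho} of E_Q[h] equals inf over lam > 0 of lam*rho + lam*log E_P[exp(h/lam)], which is a one-dimensional minimization over the scalar lam. The sample data, bounds, and names below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def worst_case_expectation(h_values, rho):
    """Worst-case E_Q[h] over all Q with KL(Q || P_hat) <= rho,
    where P_hat is the empirical distribution of `h_values`.

    Uses the standard convex dual
        sup_Q E_Q[h] = inf_{lam > 0} lam*rho + lam*log E[exp(h / lam)],
    so the minimax problem collapses to a 1-D convex minimization.
    """
    h = np.asarray(h_values, dtype=float)
    h_max = h.max()  # shift for a numerically stable log-sum-exp

    def dual(lam):
        return (lam * rho
                + lam * np.log(np.mean(np.exp((h - h_max) / lam)))
                + h_max)

    res = minimize_scalar(dual, bounds=(1e-6, 1e3), method="bounded")
    return res.fun

rng = np.random.default_rng(0)
losses = rng.normal(1.0, 0.5, size=2000)   # sampled objective values
print(losses.mean(), worst_case_expectation(losses, rho=0.1))
```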


Kullback-Leibler Divergence Measure for Multivariate Skew-Normal Distributions

The aim of this work is to provide the tools to compute the well-known Kullback-Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the multivariate skew-normal distribution, showing that this is equivalent to comparing univariate versions of these...
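The skew-normal formulas are the paper's contribution and are not reproduced here; as a hedged baseline, the sketch below computes the Jeffreys divergence (the symmetrized KLD) for two plain multivariate normal distributions, whose KLD has a standard closed form. All parameter values are illustrative.

```python
import numpy as np

def kl_mvn(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k + logdet1 - logdet0)

def jeffreys_mvn(mu0, cov0, mu1, cov1):
    """Jeffreys divergence: KL(P||Q) + KL(Q||P)."""
    return kl_mvn(mu0, cov0, mu1, cov1) + kl_mvn(mu1, cov1, mu0, cov0)

mu0, cov0 = np.zeros(2), np.eye(2)
mu1, cov1 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(jeffreys_mvn(mu0, cov0, mu1, cov1))
```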


Characterizations and Kullback-Leibler Divergence of Gompertz Distributions

In this note, we characterize the Gompertz distribution in terms of extreme value distributions and point out that it implicitly models the interplay of two antagonistic growth processes. In addition, we derive a closed-form expression for the Kullback-Leibler divergence between two Gompertz distributions. Although the latter is rather easy to obtain, it seems not to have been widely reported ...
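The paper's closed-form expression is not reproduced here; as a hedged stand-in, the sketch below estimates the KLD between two Gompertz laws by Monte Carlo, assuming the common parameterization f(x; b, eta) = b*eta*exp(b*x)*exp(-eta*(exp(b*x) - 1)) on x >= 0. The parameter values are illustrative.

```python
import numpy as np

def gompertz_logpdf(x, b, eta):
    """log-density under f(x) = b*eta*exp(b*x)*exp(-eta*(exp(b*x)-1))."""
    return np.log(b * eta) + b * x - eta * (np.exp(b * x) - 1.0)

def gompertz_sample(b, eta, size, rng):
    """Inverse-CDF sampling from F(x) = 1 - exp(-eta*(exp(b*x)-1))."""
    u = rng.uniform(size=size)
    return np.log(1.0 - np.log(1.0 - u) / eta) / b

def kl_gompertz_mc(b1, eta1, b2, eta2, n=200_000, seed=0):
    """Monte Carlo estimate of KL(P1 || P2) = E_{P1}[log p1 - log p2]."""
    rng = np.random.default_rng(seed)
    x = gompertz_sample(b1, eta1, n, rng)
    return float(np.mean(gompertz_logpdf(x, b1, eta1)
                         - gompertz_logpdf(x, b2, eta2)))

print(kl_gompertz_mc(1.0, 0.5, 1.2, 0.8))
```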


Texture Similarity Measure Using Kullback-Leibler Divergence between Gamma Distributions

We propose a texture similarity measure based on the Kullback-Leibler divergence between gamma distributions (KLGamma). We conjecture that the spatially smoothed Gabor filter magnitude responses of some classes of visually homogeneous stochastic textures are gamma distributed. Classification experiments with disjoint test and training images show that the KLGamma measure performs better than o...
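The closed-form KLD between two gamma laws that underlies such a measure is standard; the sketch below implements it in the shape/scale parameterization and, as an assumed illustrative workflow, fits gamma parameters to two stand-in response samples before comparing them. It is not the paper's pipeline.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.stats import gamma as gamma_dist

def kl_gamma(k_p, theta_p, k_q, theta_q):
    """Closed-form KL( Gamma(k_p, theta_p) || Gamma(k_q, theta_q) ),
    shape/scale parameterization."""
    return ((k_p - k_q) * digamma(k_p)
            - gammaln(k_p) + gammaln(k_q)
            + k_q * np.log(theta_q / theta_p)
            + k_p * (theta_p - theta_q) / theta_q)

# Illustrative (assumed) workflow: fit gamma parameters to two sets of
# filter-response magnitudes (stand-in samples here) and use the KLD
# between the fitted laws as the texture dissimilarity score.
a = gamma_dist.rvs(2.0, scale=1.5, size=5000, random_state=0)
b = gamma_dist.rvs(2.5, scale=1.0, size=5000, random_state=1)
k_a, _, th_a = gamma_dist.fit(a, floc=0)
k_b, _, th_b = gamma_dist.fit(b, floc=0)
print(kl_gamma(k_a, th_a, k_b, th_b))
```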



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2019

ISSN: 0018-9448, 1557-9654

DOI: 10.1109/tit.2018.2879057